Hidden Markov Model Induction by Bayesian Model Merging

Authors

  • Andreas Stolcke
  • Stephen M. Omohundro
Abstract

This paper describes a technique for learning both the number of states and the topology of Hidden Markov Models from examples. The induction process starts with the most specific model consistent with the training data and generalizes by successively merging states. Both the choice of states to merge and the stopping criterion are guided by the Bayesian posterior probability. We compare our algorithm with the Baum-Welch method of estimating fixed-size models, and find that it can induce minimal HMMs from data in cases where fixed estimation does not converge or requires redundant parameters to converge.
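The merging loop the abstract describes can be sketched in a few dozen lines. The sketch below is a strongly simplified illustration, not the paper's algorithm: it assumes each state deterministically emits a single symbol, only merges states with identical emissions, and replaces the full Bayesian posterior with a crude score (data log-likelihood minus a per-state penalty). All function names are invented for this example.

```python
import math
from collections import defaultdict
from itertools import combinations

START, END = "START", "END"

def build_initial(sequences):
    """Most specific HMM: a fresh state per symbol occurrence, so the
    initial model simply memorizes the training strings."""
    emit = {}                                       # state -> emitted symbol
    trans = defaultdict(lambda: defaultdict(int))   # state -> {successor: count}
    sid = 0
    for seq in sequences:
        prev = START
        for sym in seq:
            emit[sid] = sym
            trans[prev][sid] += 1
            prev, sid = sid, sid + 1
        trans[prev][END] += 1
    return trans, emit

def log_likelihood(trans, emit, sequences):
    """Forward algorithm; emission probability is 1 for a state's own
    symbol and 0 otherwise; transition probabilities are normalized counts."""
    total = 0.0
    for seq in sequences:
        alpha = {START: 1.0}                        # state -> forward probability
        for sym in seq:
            new = defaultdict(float)
            for s, p in alpha.items():
                denom = sum(trans[s].values())
                for t, c in trans[s].items():
                    if t != END and emit[t] == sym:
                        new[t] += p * c / denom
            alpha = new
            if not alpha:
                return float("-inf")                # symbol unreachable
        p_end = sum(p * trans[s].get(END, 0) / sum(trans[s].values())
                    for s, p in alpha.items())
        if p_end == 0.0:
            return float("-inf")
        total += math.log(p_end)
    return total

def merge_states(trans, emit, keep, drop):
    """Merge state `drop` into `keep`, summing transition counts."""
    new_trans = defaultdict(lambda: defaultdict(int))
    for s, outs in trans.items():
        for t, c in outs.items():
            new_trans[keep if s == drop else s][keep if t == drop else t] += c
    return new_trans, {s: v for s, v in emit.items() if s != drop}

def induce(sequences, prior_weight=1.0):
    """Start from the memorizing model, then greedily merge state pairs
    while the (simplified) posterior score keeps improving."""
    trans, emit = build_initial(sequences)
    def score(tr, em):
        return log_likelihood(tr, em, sequences) - prior_weight * len(em)
    best, improved = score(trans, emit), True
    while improved:
        improved = False
        for a, b in combinations(list(emit), 2):
            if emit[a] != emit[b]:
                continue                    # simplification: same-symbol merges only
            tr2, em2 = merge_states(trans, emit, a, b)
            if (s := score(tr2, em2)) > best:
                trans, emit, best = tr2, em2, s
                improved = True
                break
    return trans, emit

# The six-state model that memorizes "ab" and "abab" collapses to a
# two-state loop (one 'a' state, one 'b' state).
trans, emit = induce(["ab", "abab"])
print(len(emit))  # -> 2
```

Each merge here trades a small likelihood loss for a fixed per-state reward, which is how the stopping criterion emerges: merging halts as soon as no pair of states improves the score.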


Similar Articles

Learning Dynamical Systems Using Hidden Markov Models

In this paper, we address the problem of learning models for complex processes under the assumption that the processes can be represented within the hidden Markov model (HMM) framework. Toward that aim, we investigate the strengths and weaknesses of two competing algorithms for learning HMMs: Baum-Welch and Bayesian Model Merging. We offer insight into the reasons for the success or failure...


Modeling the Spread of Infectious Diseases Based on the Bayesian Statistical Approach

Background and Aim: Health surveillance systems are now paying more attention to infectious diseases, largely because of emerging and re-emerging infections. The main objective of this research is to present a statistical method for modeling infectious disease incidence based on the Bayesian approach. Material and Methods: Since infectious diseases have two phases, namely epidemic and non-epidem...


Model Merging for Hidden Markov Model Induction

We describe a new technique for inducing the structure of Hidden Markov Models from data using a model merging algorithm. The process begins with a maximum likelihood HMM that directly encodes the training data. Successively more general models are produced by merging HMM states. A Bayesian posterior probability criterion is used to determine which states to merge and when to stop generalizing....


Best-first Model Merging for Hidden Markov Model Induction

This report describes a new technique for inducing the structure of Hidden Markov Models from data which is based on the general `model merging' strategy (Omohundro 1992). The process begins with a maximum likelihood HMM that directly encodes the training data. Successively more general models are produced by merging HMM states. A Bayesian posterior probability criterion is used to determine whic...


Bayesian Learning of Probabilistic Language Models

The general topic of this thesis is the probabilistic modeling of language, in particular natural language. In probabilistic language modeling, one characterizes the strings of phonemes, words, etc. of a certain domain in terms of a probability distribution over all possible strings within the domain. Probabilistic language modeling has been applied to a wide range of problems in recent years, ...




Publication date: 1992